Table of Contents

Journal of Statistical Modelling: Theory and Applications
Volume: 3, Issue: 2, Summer and Autumn 2022

  • Publication date: 1402/09/20
  • Number of titles: 12
  • Elham Basiri *, Elham Hosseinzadeh Pages 1-14
    This paper considers Type I hybrid censoring and investigates the optimal sample size, which is assumed to be a truncated binomial random variable. The Rayleigh distribution is taken as the lifetime distribution. Various factors can be considered for this purpose, the most important being the sampling cost criterion. Since the sample size is a random variable, the optimal parameter of the random sample size is determined so that the total cost of the test does not exceed a pre-determined value. Numerical calculations and a simulation study are performed to evaluate the results, and conclusions are drawn (a brief illustrative sketch follows the keywords below).
    Keywords: Cost criterion, Optimal sample size, Type I hybrid censoring
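    A minimal Monte Carlo sketch of this kind of cost calculation, assuming Rayleigh lifetimes, a zero-truncated binomial sample size, and a simple cost of the form c_unit·N + c_time·(test duration); the cost structure, the stopping-rule parameters, and all numbers are illustrative assumptions, not the authors' exact criterion.

      import numpy as np

      rng = np.random.default_rng(0)

      def truncated_binomial(n_max, p, rng):
          # Binomial(n_max, p) sample size truncated away from zero (assumed truncation).
          while True:
              n = rng.binomial(n_max, p)
              if n >= 1:
                  return n

      def expected_cost(n_max, p, sigma, r_frac, T, c_unit, c_time, n_rep, rng):
          # Type I hybrid censoring: the test stops at min(r-th failure time, fixed time T).
          costs = np.empty(n_rep)
          for i in range(n_rep):
              n = truncated_binomial(n_max, p, rng)
              r = max(1, int(np.ceil(r_frac * n)))
              lifetimes = np.sort(rng.rayleigh(scale=sigma, size=n))
              duration = min(lifetimes[r - 1], T)
              costs[i] = c_unit * n + c_time * duration   # assumed cost structure
          return costs.mean()

      budget = 60.0
      for p in np.linspace(0.1, 0.9, 9):
          cost = expected_cost(n_max=30, p=p, sigma=2.0, r_frac=0.5, T=3.0,
                               c_unit=1.5, c_time=4.0, n_rep=2000, rng=rng)
          flag = "within budget" if cost <= budget else "over budget"
          print(f"p = {p:.1f}: expected cost = {cost:6.2f} ({flag})")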
  • Ahad Malekzadeh, Fatemeh Rajabi Naraki * Pages 15-29
    Efficient methods have been developed in recent decades to investigate data scenarios and to determine the model governing the changes of a random variable over time, according to the variables affecting it. One of these methods is the generalized additive model. With this kind of modelling it is possible to capture nonlinear behaviour in the data and even to make predictions. In this article we present the method nonparametrically, in cases where the variable is independent, is a time series, or enters with a lag, and we implement the estimation of the model parameters. Moreover, we demonstrate the power and effectiveness of the method through several examples (a small penalized-spline sketch follows the keywords).
    Keywords: Distributed lag models, Generalized Additive Model, Generalized linear model, Penalized likelihood, Smooth function, Splines
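    A minimal sketch of the penalized-spline idea behind additive models, fitted to simulated data with a Gaussian response; the truncated-power basis, the knot placement, and the smoothing parameter are illustrative assumptions rather than the modelling choices of the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      # Simulated data from a smooth nonlinear signal (assumed example, not from the paper).
      n = 200
      x = np.sort(rng.uniform(0, 1, n))
      y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)

      def truncated_power_basis(x, knots, degree=3):
          # Design matrix: polynomial terms plus truncated power functions at interior knots.
          cols = [x ** d for d in range(degree + 1)]
          cols += [np.maximum(x - k, 0.0) ** degree for k in knots]
          return np.column_stack(cols)

      knots = np.linspace(0.05, 0.95, 12)
      X = truncated_power_basis(x, knots)

      # Penalized least squares: a ridge penalty on the truncated-power (wiggly) coefficients,
      # which is the penalized-likelihood idea for a Gaussian response.
      lam = 1e-3
      P = np.zeros(X.shape[1])
      P[4:] = 1.0          # penalize the knot terms, not the cubic polynomial part
      beta = np.linalg.solve(X.T @ X + lam * np.diag(P), X.T @ y)
      fitted = X @ beta
      print("residual SS:", float(np.sum((y - fitted) ** 2)))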
  • AliAkbar Jafari * Pages 31-38
    In this paper, we consider a diagonal form for the variances of the errors in linear models. This form covers both homogeneous and heterogeneous error variances. First, an estimator of the variances is given, and then a method is introduced for testing hypotheses about the parameters of linear models. Some applications of this method are presented (a Monte Carlo sketch of a generalized p-value for the Behrens-Fisher problem follows the keywords).
    Keywords: Behrens-Fisher, Generalized p-value, Heterogeneous, Linear model, One-way ANOVA
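    A Monte Carlo sketch of a generalized-pivotal-quantity treatment of the Behrens-Fisher problem (two normal means with unequal variances); this is a standard construction shown only for illustration and is not claimed to be the authors' test for the diagonal variance structure. The data are simulated.

      import numpy as np

      rng = np.random.default_rng(2)

      # Two simulated normal samples with unequal variances (illustrative data).
      x = rng.normal(loc=0.0, scale=1.0, size=12)
      y = rng.normal(loc=0.8, scale=2.5, size=20)

      def generalized_p_value(x, y, n_draws=100_000, rng=rng):
          # Generalized pivotal quantity for mu_x - mu_y:
          #   mu_i* = xbar_i - Z_i * sqrt(s_i^2 * (n_i - 1) / (V_i * n_i)),
          # with Z_i ~ N(0,1) and V_i ~ chi^2_{n_i - 1} drawn independently.
          nx, ny = len(x), len(y)
          xbar, ybar = x.mean(), y.mean()
          sx2, sy2 = x.var(ddof=1), y.var(ddof=1)
          zx, zy = rng.standard_normal(n_draws), rng.standard_normal(n_draws)
          vx, vy = rng.chisquare(nx - 1, n_draws), rng.chisquare(ny - 1, n_draws)
          mux = xbar - zx * np.sqrt(sx2 * (nx - 1) / (vx * nx))
          muy = ybar - zy * np.sqrt(sy2 * (ny - 1) / (vy * ny))
          diff = mux - muy
          # Two-sided generalized p-value for H0: mu_x = mu_y.
          return 2 * min((diff <= 0).mean(), (diff >= 0).mean())

      print("generalized p-value:", round(generalized_p_value(x, y), 4))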
  • Masoud Yarmohammadi *, Ada Afshar, Rahim Mahmoudvand, Parviz Nasiri Pages 39-50
    The nonhomogeneous Poisson process is commonly used to model the occurrence of events over time. The identification of a nonhomogeneous Poisson process relies on its intensity function, which can be difficult to determine. A straightforward approach is to set the intensity function to a constant value, resulting in a homogeneous Poisson process. However, it is crucial to assess the homogeneity of the intensity function through an appropriate test beforehand. If homogeneity is not confirmed, one faces an infinite-dimensional problem that cannot be resolved comprehensively. In this study, we analyzed data on the number of passengers using the Tehran metro. Our homogeneity test showed a nonhomogeneous arrival rate of passengers, prompting us to explore different functions to estimate the intensity function. We considered four functions and used a piecewise function to determine the best intensity function. Our findings showed significant differences between the two models, highlighting the effectiveness of the piecewise-function model in predicting the number of metro passengers (a small simulation sketch follows the keywords).
    Keywords: Hypothesis testing, Intensity Function, Nonhomogeneous Poisson process, Poisson process
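    A small sketch of simulating a nonhomogeneous Poisson process by thinning (Lewis-Shedler) under an assumed piecewise-constant intensity resembling rush-hour arrival patterns; the intensity values are invented for illustration and are unrelated to the Tehran metro data analysed in the paper.

      import numpy as np

      rng = np.random.default_rng(3)

      def intensity(t):
          # Assumed piecewise-constant arrival rate (events per hour) over a 24-hour day.
          if 6 <= t < 9:
              return 80.0      # morning peak
          if 16 <= t < 19:
              return 90.0      # evening peak
          if 9 <= t < 16:
              return 40.0
          return 10.0          # night

      def simulate_nhpp(T, lam_max, rng):
          # Lewis-Shedler thinning: propose homogeneous Poisson(lam_max) points,
          # accept each proposed point s with probability intensity(s) / lam_max.
          times, t = [], 0.0
          while True:
              t += rng.exponential(1.0 / lam_max)
              if t > T:
                  return np.array(times)
              if rng.random() < intensity(t) / lam_max:
                  times.append(t)

      arrivals = simulate_nhpp(T=24.0, lam_max=90.0, rng=rng)
      counts, _ = np.histogram(arrivals, bins=24, range=(0, 24))
      print("hourly counts:", counts)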
  • Abouzar Bazyari * Pages 51-70
    The main focus of this paper is to extend the analysis of some ruin-related problems to a class of state-space compound binomial risk models for a sequence of independent and identically distributed interclaim times when the claim occurrences are homogeneous. First, we obtain the mass function of a defective renewal sequence of random {F_n}_{n≥0}-stopping times, using the compound binomial distribution of the aggregate claim amount together with the net profit condition, and compute the infinite-time ruin probability via the Markov property of the risk process. Moreover, we derive the distribution of the time to ruin, among other random variables associated with ruin, using the convolution of the claim amounts and Lagrange's implicit function theorem. Lastly, the theoretical results are illustrated with numerical computations (a short recursion sketch follows the keywords).
    Keywords: Compound binomial risk model, Homogeneous claim occurrences, Ruin probability, Time to ruin
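    A short dynamic-programming sketch of ruin probabilities in a compound binomial model, using a finite horizon as a simple proxy for the infinite-time quantity; the premium rate, claim-size distribution, and ruin convention (negative surplus) are illustrative assumptions, not the paper's state-space model.

      import numpy as np
      from functools import lru_cache

      # Compound binomial model (illustrative conventions): premium 1 per period,
      # a claim occurs in each period with probability p, claim sizes are i.i.d. on
      # {1, 2, 3} with the pmf below, and ruin means a negative surplus.
      p = 0.3
      claim_pmf = {1: 0.5, 2: 0.3, 3: 0.2}     # assumed claim-size distribution

      @lru_cache(maxsize=None)
      def ruin_prob(u, horizon):
          # Probability of ruin within `horizon` periods starting from integer surplus u.
          if horizon == 0:
              return 0.0
          # No claim this period:
          prob = (1 - p) * ruin_prob(u + 1, horizon - 1)
          # A claim of size x this period:
          for x, fx in claim_pmf.items():
              new_u = u + 1 - x
              prob += p * fx * (1.0 if new_u < 0 else ruin_prob(new_u, horizon - 1))
          return prob

      for u in range(6):
          print(f"u = {u}: ruin probability within 50 periods = {ruin_prob(u, 50):.4f}")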
  • Zohreh Pakdaman *, Reza Alizadeh Noughabi Pages 71-83
    In this paper, the problem of inference on the stress-strength reliability under ranked set sampling and simple random sampling from the Lévy distribution is investigated. The maximum likelihood estimators, their asymptotic distributions, and Bayes estimators are provided for the stress-strength reliability parameter. Furthermore, using Monte Carlo simulation, for both sampling methods, namely simple random sampling and ranked set sampling, the Bayes risks and the efficiencies of the obtained estimators are computed and compared (a Monte Carlo sketch follows the keywords).
    Keywords: Maximum likelihood estimator, Ranked set sampling, Stress-strength reliability
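    A minimal sketch of the ingredients above: Lévy variate generation, ranked set sampling with perfect ranking, the scale MLE for a Lévy sample with location zero, and a plug-in Monte Carlo estimate of R = P(stress < strength). Applying the simple-random-sampling MLE formula to the ranked set sample is a simplification for illustration, not the paper's RSS likelihood.

      import numpy as np

      rng = np.random.default_rng(4)

      def rvs_levy(c, size, rng):
          # Lévy(0, c) variates via the representation X = c / Z^2 with Z ~ N(0, 1).
          z = rng.standard_normal(size)
          return c / z**2

      def ranked_set_sample(c, set_size, cycles, rng):
          # Ranked set sampling with perfect ranking: in each cycle draw `set_size` sets of
          # `set_size` units and keep the i-th smallest unit from the i-th set.
          out = []
          for _ in range(cycles):
              for i in range(set_size):
                  s = np.sort(rvs_levy(c, set_size, rng))
                  out.append(s[i])
          return np.array(out)

      def mle_scale(x):
          # MLE of the Lévy scale c (location fixed at 0): c_hat = n / sum(1 / x_i).
          return len(x) / np.sum(1.0 / x)

      c_stress, c_strength = 1.0, 3.0
      x_srs = rvs_levy(c_stress, 25, rng)                # simple random sample of stresses
      y_rss = ranked_set_sample(c_strength, 5, 5, rng)   # ranked set sample of strengths
      c1_hat, c2_hat = mle_scale(x_srs), mle_scale(y_rss)

      # Plug-in Monte Carlo estimate of R = P(stress < strength).
      xs = rvs_levy(c1_hat, 200_000, rng)
      ys = rvs_levy(c2_hat, 200_000, rng)
      print("estimated R =", round(float((xs < ys).mean()), 4))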
  • Abouzar Bazyari * Pages 85-101
    The present paper considers a discrete-time risk model with a homogeneous, irreducible, and aperiodic Markov chain. The general distribution of the total claim amounts is influenced by the environmental Markov chain, and in the i-th period the individual claim sizes are conditionally independent. We obtain recursive formulae for the infinite-time ruin probability using ordinary generating functions. In addition, we give conditions under which ruin cannot occur. In the last part, we present numerical illustrations of the results and address a practical problem through a fully developed case study in the domain of social insurance (a small simulation sketch follows the keywords).
    Keywords: Discrete-time risk model, Homogeneous Markov chain, Ruin probability, Stationary distribution, Transition probability matrix
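    A crude simulation sketch of a Markov-modulated discrete-time risk model with two environmental states; the transition matrix, the Poisson claim counts, the exponential claim sizes, and the finite horizon are illustrative assumptions and do not reproduce the paper's generating-function recursions.

      import numpy as np

      rng = np.random.default_rng(5)

      # Two-state environmental Markov chain (assumed for illustration): in each period the
      # number of claims is Poisson with a state-dependent mean, claim sizes are i.i.d.
      # exponential, and the premium income is 1 per period.
      P = np.array([[0.9, 0.1],
                    [0.3, 0.7]])        # transition probability matrix
      claim_rate = np.array([0.4, 0.8]) # Poisson claim-count mean in each state
      mean_claim = 1.0                  # exponential claim-size mean

      def ruin_probability(u0, horizon, n_rep, rng):
          # Monte Carlo estimate of the finite-horizon ruin probability from surplus u0.
          ruined = 0
          for _ in range(n_rep):
              u, state = u0, 0
              for _ in range(horizon):
                  state = rng.choice(2, p=P[state])
                  n_claims = rng.poisson(claim_rate[state])
                  u += 1.0 - rng.exponential(mean_claim, n_claims).sum()
                  if u < 0:
                      ruined += 1
                      break
          return ruined / n_rep

      for u0 in (0.0, 2.0, 5.0):
          print(f"u0 = {u0}: estimated ruin probability = {ruin_probability(u0, 200, 2000, rng):.3f}")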
  • Vahid Nekoukhou * Pages 103-117
    The two-parameter discrete Weibull distribution is an important model, especially in reliability studies when the data are reported on a discrete scale. However, its hazard rate function is either monotonically increasing or monotonically decreasing. The present paper provides a family of parametric discrete distributions, an infinite mixture of exponentiated discrete Weibull distributions, that is versatile in fitting increasing, decreasing, and bathtub-shaped failure rate models to different discrete life-test data. Some important distributional properties of the model, such as the moments, order statistics, and infinite divisibility, are investigated, and the parameters of the distribution are estimated by the maximum likelihood method. In addition, a real data set is analyzed to show the effectiveness of the model (a short sketch of the discrete Weibull hazard shapes follows the keywords).
    Keywords: Discrete univariate model, Infinite divisibility, Maximum likelihood estimation, Order statistics
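    A short sketch illustrating the motivation stated above: the ordinary (Nakagawa-Osaki) discrete Weibull hazard is monotone, increasing for beta > 1 and decreasing for beta < 1 (and constant, i.e. geometric, for beta = 1). The parameter values are arbitrary, and the sketch does not implement the paper's mixture family.

      import numpy as np

      def discrete_weibull_pmf(x, q, beta):
          # Nakagawa-Osaki discrete Weibull: survival S(x) = q^(x^beta) on x = 0, 1, 2, ...
          return q ** (x ** beta) - q ** ((x + 1) ** beta)

      def hazard(x, q, beta):
          # Discrete hazard h(x) = P(X = x) / P(X >= x).
          return discrete_weibull_pmf(x, q, beta) / q ** (x ** beta)

      x = np.arange(0, 10)
      for beta in (0.7, 1.0, 1.8):
          h = hazard(x, q=0.8, beta=beta)
          if np.allclose(h, h[0]):
              trend = "constant"
          elif h[0] > h[-1]:
              trend = "decreasing"
          else:
              trend = "increasing"
          print(f"beta = {beta}: hazard is {trend}:", np.round(h, 3))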
  • Adeleh Fallah * Pages 119-143
    In this paper, we consider a k-component coherent system in which the system lifetimes are observed, the system structure is known, and the component lifetimes follow the proportional hazard rate model. We discuss the prediction problem based on Type-II censored coherent system lifetime data. For predicting future system failures, we obtain the maximum likelihood predictor, the best unbiased predictor, the conditional median predictor, and the Bayesian predictors. Since the integrals involved in the Bayes predictors do not appear to possess closed forms, the Metropolis-Hastings method is applied to approximate them. Different interval predictors based on classical and Bayesian approaches are derived. A numerical example is presented to illustrate the prediction methods used in this paper. A Monte Carlo simulation study is performed to evaluate and compare the performance of the different prediction methods.
    Keywords: Bayesian predictor, Best unbiased predictor, Conditional median predictor, Maximum likelihood predictor, Prediction intervals
  • Conditions for interior based constrained prior distributions to ensure probability density
    Amirhossein Ghatari, Elham Tabrizi * Pages 145-155
    In Bayesian inference, the choice of prior distributions plays a fundamental role. While priors need not conform to proper probability densities and may be improper, obtaining proper prior densities remains a challenge in the Bayesian literature. This article explores a set of conditions under which specific assumptions can be established, ensuring that maximum entropy priors and restricted reference priors become proper and are therefore genuine probability densities. By examining these conditions, the study contributes to the advancement of proper prior construction in Bayesian analysis.
    Keywords: Constrained prior, Jensen inequality, Maximum entropy prior, Restricted reference priors
  • Fatemeh Hassantabar Darzi *, Firoozeh Haghighi, Samaneh Eftekhari Mahabadi Pages 157-167
    In designing an optimal life-testing experiment under a censoring setup, the removal vector is usually chosen by optimizing a suitable criterion function. Criterion functions are usually constructed from cost or variance functions, and sometimes a combination of both. This paper considers a multi-objective optimization problem in the context of Type-II progressive censoring with random, dependent removals. A simple simulation algorithm is presented for obtaining the optimal scheme in this multi-objective optimal design problem. Several simulation studies are conducted to evaluate and compare the performance of the proposed strategy. Finally, some concluding remarks and directions for future work are provided (a small simulation sketch follows the keywords).
    Keywords: Cost function, Dependent random removal mechanism, Multi-objective optimal design
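    A simulation sketch of evaluating a scalarized criterion over a progressively Type-II censored experiment; it uses exponential lifetimes, independent binomial random removals, and an assumed cost of the form c_time·(expected duration) + c_unit·n, so it illustrates the general idea only and not the dependent removal mechanism or multi-objective criterion of the paper.

      import numpy as np

      rng = np.random.default_rng(6)

      def progressive_sample(n, m, removal_p, scale, rng):
          # One progressively Type-II censored experiment with exponential(scale) lifetimes:
          # after each observed failure, surviving units are withdrawn at random.
          pool = list(rng.exponential(scale, n))
          failures = []
          for i in range(m):
              pool.sort()
              failures.append(pool.pop(0))                  # next observed failure
              if i < m - 1:
                  max_removable = len(pool) - (m - 1 - i)   # keep enough units for later failures
                  r = rng.binomial(max_removable, removal_p) if max_removable > 0 else 0
                  for _ in range(r):                        # remove r surviving units at random
                      pool.pop(rng.integers(len(pool)))
          return np.array(failures)

      def criterion(n, m, removal_p, scale, c_time, c_unit, n_rep, rng):
          # Monte Carlo estimate of an assumed scalarized criterion:
          # expected test duration weighted by c_time plus the cost of the n units on test.
          durations = [progressive_sample(n, m, removal_p, scale, rng)[-1] for _ in range(n_rep)]
          return c_time * float(np.mean(durations)) + c_unit * n

      for removal_p in (0.1, 0.3, 0.5):
          val = criterion(n=30, m=10, removal_p=removal_p, scale=1.0,
                          c_time=5.0, c_unit=0.5, n_rep=1000, rng=rng)
          print(f"removal probability {removal_p}: criterion = {val:.3f}")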
  • Roshanak Alimohammadi * Pages 169-173
    The sinusoidal model has many applications in time series analysis‎, ‎signal processing‎, ‎regression‎, ‎and other phenomena that are repeated periodically‎. ‎On the other hand‎, ‎smoothing spline is a flexible and useful method in many fields‎. ‎In this article‎, ‎smoothing spline is applied to interpolate data generated from the sinusoidal model‎. ‎Therefore‎, ‎a sinusoidal model is considered in three general forms‎. ‎Then‎, ‎in a simulation study‎, ‎data sets are generated from each of the sinusoidal model forms‎, ‎and the effect of changing the model components is assessed‎. ‎Besides‎, ‎the smoothing spline method is applied to estimate the related sinusoidal model‎, ‎and the performance of the smoothing spline for fitting a proper model to the sinusoidal data is studied‎. Furthermore‎, ‎by fitting a proper sinusoidal model to each generated data set‎, ‎the performance of smoothing spline is compared with the sinusoidal model‎. The ‎sum of squares error criterion is applied to compare the performance of models‎. ‎The simulation results illustrate that smoothing spline has better performance for model fitting to sinusoidal data.
    Keywords: Amplitude, Angular frequency, Phase, Sine model, Spline
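    A brief sketch of the comparison described above, assuming a single sinusoidal form, an ad-hoc smoothing factor, and SciPy's UnivariateSpline and curve_fit; the three model forms and the settings of the paper's simulation study are not reproduced.

      import numpy as np
      from scipy.interpolate import UnivariateSpline
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(7)

      # Simulated data from one simple sinusoidal form A*sin(w*t + phi) plus noise
      # (the amplitude, angular frequency, phase, and noise level are illustrative).
      t = np.linspace(0, 4 * np.pi, 200)
      y = 2.0 * np.sin(1.5 * t + 0.4) + 0.3 * rng.standard_normal(t.size)

      # Smoothing spline fit; the smoothing factor s is chosen ad hoc here.
      spline = UnivariateSpline(t, y, s=len(t) * 0.3 ** 2)
      sse_spline = float(np.sum((y - spline(t)) ** 2))

      # Parametric sinusoidal fit by nonlinear least squares,
      # started from a rough initial guess near the true frequency.
      def sinusoid(t, a, w, phi):
          return a * np.sin(w * t + phi)

      params, _ = curve_fit(sinusoid, t, y, p0=[1.0, 1.4, 0.0])
      sse_sin = float(np.sum((y - sinusoid(t, *params)) ** 2))

      print(f"SSE spline = {sse_spline:.2f}, SSE sinusoidal fit = {sse_sin:.2f}")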